
The Power of Basic Technical SEO: Case Study

I’m a big fan of strategic thinking. In fact, most of what I write about related to SEO and digital marketing flows from a strategic viewpoint. I’m the guy who clients (try to) hire for SEO, then ends up telling them to first do better persona development and refine their brand. It’s just how my brain functions. 

As a result of my strategy fetish, I tend to, well, loathe posts that simply focus on tactical implementation. I find that many marketers aren’t really ready for tactical implementation because they don’t know why they are really doing what they are doing, other than “it’s my job.” I believe this poses a serious risk for business owners who hear about the latest tactics and then attempt to bolt them onto their current marketing activities with little strategic thinking guiding the decision. This leads to missed opportunities at best, and other things we won’t discuss in this post at worst.

However, technical SEO, especially the basics, often makes my ranting look foolish. Why? Because the recommendations are black and white best practices that can yield powerful results when implemented. No strategic thinking required. You’ve either got it right, or you don’t. If you don’t, you’re likely hurting yourself unnecessarily. If you do get it right/fix the issues, you can unlock potential in your site as a marketing tool that you never knew you had—even in this modern day of search.

This post is designed to help business owners, developers and those new to SEO understand the impact that small technical changes can have on a business’s bottom line. There are multiple agencies (including my own) that make good and honest money by solving these problems for brands. In this way, SEOs are like mechanics in the services they provide.

Let’s take a look at what can happen when you dial in a few basic best practices of technical SEO.

The Situation

A few months ago, I started SEO work for a small business (under $10 million in revenue with about 50 employees) that operates in a niche e-commerce space. In many ways, they’re fortunate enough to be considered the “bully” on the block. Their organic traffic was consistently showing 80-90% year-over-year growth each month and revenue was rising along with it. Their brand is known and favored in their niche, and success was clearly driving more success.

I was tasked with coaxing more growth from the organic channel and helping the company realize their full potential. No problem (gulp). Who couldn’t improve numbers like that? Truth is, I tend to like situations like this, as the current data is indicative of a business doing things well and operating in a good space. There’s always opportunity for improvement.

The Issues

I started with a deep technical SEO audit, as always. Early on, a quick visual inspection highlighted multiple issues that could be easily fixed for quick wins.

1) Internal linking to non-canonical pages that were 301 redirected to canonical pages

[Screenshot: Category Landing Page]

[Screenshot: Individual Product Page (related products and projects listed at the bottom of the page)]

This problem can be identified visually by doing the following. (Tools help, but aren’t necessary.)

  1. Navigating to the page under inspection in the Chrome browser
  2. Placing your mouse over different page elements that link to other pages/site content
  3. Looking near the bottom left of the browser and taking note of the URL displayed (this is telling you where the link you’re hovering over is pointing)
  4. Clicking the link and comparing the URL you just looked at to the one you see at the top of your browser in the address bar

If they match, you should be in good shape. If they don’t, you’ll likely need to do some deeper digging or talk with a search expert. 

If you’ve crawled the site using Screaming Frog or a similar tool, you can also check for redirect status codes (3xx) on any URLs crawled. If you find some, check the internal linking to the page. (Here’s more quick insight into how to do that with Screaming Frog, specifically.) 
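
If you’d rather script this check than eyeball it, here’s a minimal sketch of the same idea in Python. The page URL is a placeholder and it assumes the requests and beautifulsoup4 packages are installed; treat it as a starting point, not a full crawler.

```python
# Minimal sketch: flag internal links on a page that point at redirecting URLs.
# The page URL is a placeholder; assumes `requests` and `beautifulsoup4`.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

PAGE = "https://www.example.com/category/widgets/"  # hypothetical page to audit

html = requests.get(PAGE, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

site_host = urlparse(PAGE).netloc
seen = set()

for a in soup.find_all("a", href=True):
    url = urljoin(PAGE, a["href"])               # resolve relative links
    if urlparse(url).netloc != site_host or url in seen:
        continue                                 # skip external and repeated links
    seen.add(url)
    resp = requests.head(url, allow_redirects=False, timeout=10)
    if 300 <= resp.status_code < 400:
        print(f"{url} -> {resp.status_code} -> {resp.headers.get('Location')}")
```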

Common problems resulting from this issue are:

  • Loss of link equity
  • Duplicate content
  • Poor indexation

The fix is simple: Get your internal links pointing to their respective canonicals.

2) Thin and duplicate/near-duplicate content

[Screenshot: Gallery Landing Page]

A lot of SEOs, savvy business owners and developers hear “thin content” or even “duplicate content” these days and immediately assume the search issue comes from the possibility of being attacked by a zoo animal. While it certainly appears that Panda and other algorithms take note of sites with thin content and may adjust a site’s rankings accordingly, thin or duplicate pages can become an issue simply because, when search engines have trouble knowing which page to rank, link equity gets dispersed across the duplicates.

While identifying this problem can vary greatly based on the situation, in this instance it wasn’t hard to find. One of the first things any seasoned SEO will look at when evaluating the health of an e-commerce site (or any site with similar sorting/filtering content sections) is whether or not it has sort and/or faceted navigation. These little boogers can be very helpful for users, but they cause tricky situations for crawl-happy engines. In this case, an image-driven section of the site used to grab top-of-funnel organic visits from related searches (visits that often convert down the road) was not well designed or developed.

This problem can be identified by simply playing with a target page on the site with sort and facet navigation. Paying attention to what happens to the URLs as you change different things around will often reveal if you have a problem.

For example, let’s say you allow users to refine products in a category by size and color. In many cases, site developers allow users to select color and then limit by size. They also may allow a user starting on the same page to first select a size and then narrow by color. That’s all it takes to have a problem with duplication. 
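
To make that concrete, here’s a toy illustration with made-up URLs: the same page reached through two different parameter orders, which a quick normalization shows are really one page living at two crawlable addresses.

```python
# Toy illustration: two facet orderings that produce the "same" page at different URLs.
# The URLs are hypothetical; normalizing the query string shows they collapse to one page.
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def normalize(url):
    """Sort query parameters so parameter order no longer matters."""
    parts = urlparse(url)
    params = sorted(parse_qsl(parts.query))
    return urlunparse(parts._replace(query=urlencode(params)))

a = "https://www.example.com/womens-running-shoes?color=orange&size=8"
b = "https://www.example.com/womens-running-shoes?size=8&color=orange"

print(normalize(a) == normalize(b))  # True: one page, two crawlable URLs
```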

The typical fix in such instances is often not simple. You need to consider your target market, keyword research, technical constraints and some other situational factors before deciding on a course of action. I’d highly suggest you talk with an SEO expert if you aren’t one, as a mistake here can quickly send your revenue in the wrong direction.

There are already a number of guides available that help you think through these issues and explain them in more depth. Personally, I’m a fan of Adam Audette for his technical suggestions in these situations, but many Moz folks such as Dr. Pete have shared their knowledge on related subjects and are (of course) just as sharp in this area. Essentially, while there’s no need to reinvent the wheel, the devil is in knowing what to look for and which solution to apply in each situation.

The Solutions

Let’s take a look at how these specific situations were solved and the thinking behind each.

Problem 1: Internal linking to non-canonical pages that were 301-redirected to canonical pages

As I mentioned above, this is a relatively easy fix... er, at least a logically simple one. One of the best (and sometimes worst) things about modern websites is how well they scale. A change here quickly populates across every page. It’s awesome, unless you do something wrong :).

To solve this specific instance of non-canonical internal linking, I simply used the inspect element and view source functionality in Chrome to find the link markup that matched what I was seeing in my browser. I then clicked the link and compared the URL in the HTML to the address shown at the top of my browser. Clearly, they didn’t match and the links needed to be updated.

Notes made, sent to developer, fix implemented, world saved.

I will add that I confirmed the type of redirect just to make 100% sure things were functioning as I thought, verifying that the URL I was seeing in the HTML was indeed running through a redirect by using cURL commands in Terminal.

A simple curl --head www.enteryoursite.com/entertheurl/ in Terminal will show you the redirect type and the ultimate destination. Here’s a simple example using the old SEOMoz OSE site:

[Screenshot: Terminal output showing the redirect type and final destination]
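
If you prefer scripting to the command line, a rough Python equivalent (placeholder URL; requests package assumed) prints each hop in the chain and the final destination:

```python
# Rough Python equivalent of `curl --head`: show each hop in a redirect chain.
# The URL is a placeholder; assumes the `requests` package.
import requests

resp = requests.head("https://www.example.com/old-url/", allow_redirects=True, timeout=10)

for hop in resp.history:                  # each redirect in the chain
    print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
print(resp.status_code, resp.url)         # the ultimate destination
```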

Problem 2: Thin and duplicate/near-duplicate content

Dealing with faceted navigation and sorts can be significantly more complicated—especially with enterprise-level e-commerce sites. In this case, the situation was relatively straightforward (relatively being the key word here). Major items I considered while working through the specific situation I faced included:

  • Technical constraints – I needed to understand which solutions were possible for the developers to implement and which weren’t (i.e., was it easy for them to implement canonical tags? Could they tightly control redirects and use all types?). These seem like “gimmes,” but sometimes various factors make one technical “tool” harder to use than another.
  • Crawl budget – I wanted to think through the risk of allowing bots to crawl all the pages that could be created, as well as the potential impact of cutting some pages off completely. Note: most of the time, there isn’t going to be a great reason to really “cut off” a big batch of pages from crawling, but it’s important to think through your situation completely and consider all options in light of your objective.
  • User experience – A big part of deciding how to handle sorts and facets should come down to how your users are going to be connecting with your site through search. You obviously don’t want to cut off pages that might be great search results for users (especially if you’re seeing data to indicate there’s search volume for terms related to the content on the page), but of equal (and arguably more) importance, you want to make sure link equity and other signals are consolidated so that the best pages rank for the topics and terms your users are searching for.
  • Link equity – I wanted to make sure I didn’t limit or waste link equity from internal or external links. 
  • Current traffic and rankings – It would be a real shame to implement something that seems like a good/ideal solution, only to have it undercut pages that are already driving traffic well.

Let’s take a brief look at how I dove into each of these items, what I learned, and how I chose a final course of action.

Technical constraints – I connected with the development team and asked them how difficult it would be to implement rule-based 301 redirects, canonical tags and other “tools” I might want to use, such as nofollow link tags or meta robots instructions. Based on the discussion, it was clear that everything was pretty much fair game.

Crawl budget – In this situation, I decided not to worry about any crawl control, as the site had a decent link profile (link equity impacts crawl budget in theory) and there were very few (if not zero) pages that I wouldn’t be consolidating using either a canonical or redirect. I also personally don’t worry too much about crawl budget unless there are a bunch of pages that really don’t make sense to have search engines crawling for some reason. My experience has shown me that engines are continually getting better and better at appropriately utilizing signals we leave and focusing their crawl on juicy content. Nothing major to worry about here. 

User experience and current traffic and rankings – Topic and keyword research is powerful. I used a combination of the Adwords Keyword Planner, suggested and related search (Google) information, and tools like ubersuggest.com to understand what users were looking for related to my client’s content on this section of the site. I wanted to know if various sorts and facets were potentially creating valuable pages that I should figure out how to keep as canonicals.

A few example query modifiers that represent common facets are:

  • Temporal – “newest…” – “newest women’s running shoes”
  • Opinion – “best rated…” – “top women’s running shoes”
  • Color – “orange…” – “orange women’s running shoes”
  • Size – “small…” – “women’s hiking shorts small”

While I have no idea whether any of the example queries above have search volume, the list does show how you might best answer a user query with a page your site creates when a user adjusts sorts and facets. It’s likely that in many industries where facets and sorts make sense, there’s some search volume related to those modifiers. After all, if people are pulling up these pages on the site, why wouldn’t they be searching for them to start with? Understanding this concept is crucial to making wise decisions.
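
As a rough sketch of how you might quantify this, the snippet below buckets an exported keyword list by those modifier types. The CSV file name, its columns, and the modifier words are all hypothetical; adapt them to whatever your keyword tool actually exports.

```python
# Sketch: bucket an exported keyword list (hypothetical CSV with "keyword" and
# "searches" columns) by the facet-style modifiers discussed above.
import csv
from collections import defaultdict

MODIFIERS = {
    "temporal": ["newest", "new"],
    "opinion": ["best", "top", "best rated"],
    "color": ["orange", "black", "blue"],
    "size": ["small", "medium", "large"],
}

volume_by_bucket = defaultdict(int)

with open("keyword_export.csv", newline="") as f:   # hypothetical export file
    for row in csv.DictReader(f):
        kw = row["keyword"].lower()
        for bucket, words in MODIFIERS.items():
            if any(w in kw for w in words):
                volume_by_bucket[bucket] += int(row["searches"])

for bucket, vol in sorted(volume_by_bucket.items(), key=lambda x: -x[1]):
    print(f"{bucket}: ~{vol} monthly searches")
```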

I’ll also note that I utilized some basic persona research and did some of my own digging on social networks, magazine sites, related business sites, etc., to see how people were talking about these topics and utilizing related phrases/language/keywords.

After gathering a good understanding of how users were looking for information, I then dug into the available analytics and GWT data in order to understand how engines were currently prioritizing URLs. Remember, with duplicate content you’re forcing engines to make choices. Sometimes it’s very clear which choice they are leaning toward; other times it’s not.

I utilized the advanced dimension filtering in Google Analytics to examine organic visits to the section of the site I was working within. 

Here’s a quick visual of that process:

[Screenshot: Google Analytics advanced dimension filter]

I paid specific attention to which URL structure seemed to be:

  • Garnering the most visits
  • Generating good user engagement
  • Resulting in important goal completions 

In Google Webmaster Tools, I utilized the Search Queries -> Top Pages report to see which version of the landing pages within this section of the site seemed to be most prevalent for important queries. 

Essentially I asked this simple question: “Are more visits coming to URL version A, B, or C, or… ?” You get the idea. 

An example set you might have is this:
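
Purely for illustration (the URL patterns and export columns below are made up), a small script like this can group organic landing pages into their URL “versions” and total the sessions for each:

```python
# Illustration only: group organic landing pages from a GA export into URL "versions"
# (hypothetical patterns and column names) and total the sessions for each.
import csv
import re
from collections import Counter

VERSIONS = {
    "A (clean path)":    re.compile(r"^/gallery/[^/?]+/$"),
    "B (?sort= param)":  re.compile(r"^/gallery/[^/?]+/\?sort="),
    "C (?color= param)": re.compile(r"^/gallery/[^/?]+/\?color="),
}

sessions_by_version = Counter()

with open("ga_landing_pages.csv", newline="") as f:   # hypothetical GA export
    for row in csv.DictReader(f):
        for name, pattern in VERSIONS.items():
            if pattern.search(row["landing_page"]):
                sessions_by_version[name] += int(row["sessions"])
                break

for name, total in sessions_by_version.most_common():
    print(f"{name}: {total} organic sessions")
```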

After completing my user analysis, I was able to do two things:

  • Understand the types of queries searchers were using to find information on this section of the site and how they were structuring and modifying those queries. I was also able to better understand the connections search engines were seeing between queries and topics. 
  • Map these query types to the best possible version of each landing page based on current search exposure and traffic data. 

At this point, I had a fairly strong sense for which types of pages I would need to keep as canonical and what the actual version/URL structure should look like for those pages. During my analytics and data review, I had also kept my eyes open for odd pages that were breaking the mold for some reason. In this situation, what engines were favoring was fairly clear, with only a few exceptions.

Link equity – After walking through the items above, I was feeling pretty good: I understood how I needed to combine pages, I wasn’t going to worry about restricting crawl, and I was largely “free” from a technical standpoint to use rel=canonical tags, 301 redirects, or some combination of the two to consolidate pages.

The link equity check simply involved me utilizing Ahrefs and MajesticSEO (specifically the features which allow you to pull link equity at the directory/sub-folder level) to understand if there were certain URLs or URL structures that had gained a large portion of link equity for some reason. 
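
As a sketch of that roll-up, the snippet below counts external links by top-level subfolder from an exported backlink list. The CSV file name and column are hypothetical; substitute whatever your link tool exports.

```python
# Sketch: roll up an exported backlink list (hypothetical CSV columns) by subfolder
# to spot URL groups that have accumulated meaningful external link equity.
import csv
from collections import Counter
from urllib.parse import urlparse

links_by_folder = Counter()

with open("backlinks_export.csv", newline="") as f:    # hypothetical export file
    for row in csv.DictReader(f):
        path = urlparse(row["target_url"]).path
        folder = "/" + path.strip("/").split("/")[0] + "/" if path.strip("/") else "/"
        links_by_folder[folder] += 1

for folder, count in links_by_folder.most_common(10):
    print(f"{folder}: {count} external links")
```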

In this scenario, few of the URL groups or individual URLs I was working with had large amounts of external links or total link equity to worry about. If I had found something, I would have considered this in light of my choices above, as I hate to run good link equity through redirects or canonical tags. Nothing caused me to change course as a result of my analysis.

Wrapping it Up – As a result of my deep analysis on problem two, I was able to confidently provide a data- and user-driven solution that consolidated important search signals and removed duplicate URLs from the index. I was also able to update old sitemaps and provide a guide for adding future content (pages).

In this specific situation, I recommended the use of canonical tags across what were now determined to be duplicate pages. 
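
To sanity-check an implementation like this once it ships, a quick script (placeholder URLs; requests and beautifulsoup4 assumed) can confirm that each duplicate variant declares the intended canonical:

```python
# Quick sanity check: confirm each duplicate variant declares the intended canonical.
# URLs are placeholders; assumes `requests` and `beautifulsoup4`.
import requests
from bs4 import BeautifulSoup

EXPECTED = {   # hypothetical duplicate -> intended canonical
    "https://www.example.com/gallery/kitchens/?sort=newest":
        "https://www.example.com/gallery/kitchens/",
    "https://www.example.com/gallery/kitchens/?color=white":
        "https://www.example.com/gallery/kitchens/",
}

for dupe, target in EXPECTED.items():
    soup = BeautifulSoup(requests.get(dupe, timeout=10).text, "html.parser")
    tag = soup.find("link", rel="canonical")
    actual = tag["href"] if tag else None
    status = "OK" if actual == target else "CHECK"
    print(f"{status}  {dupe} -> {actual}")
```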

Many technical SEOs will go straight to 301 redirects in a case like the one I described above, especially if things line up neatly and there appears to be little risk. I, however, tend to be very cautious when implementing large, sweeping technical recommendations on high-value sections of a client’s site. I opted for canonicals because I wanted to observe how search engines decided to treat the recommendations instead of forcing them to adjust right away.

This gave me insight into whether they viewed the decisions as “the best choice,” without forcing a change I might only have to roll back if something went wrong.

Cautionary note: A wrong step in this area can negatively impact sites and might take a long time (sometimes months) to recover from. All adjustments, but especially forced-direction ones, carry some level of risk. It’s important to understand the potential impact and plan accordingly.

The Results

Now for the good stuff. I addressed these two issues with the client’s development team, and here’s what came out of it:

[Screenshot: Organic Traffic with Month-Over-Month Comparison (E-Commerce data from previous year was unavailable)]

While a picture is worth a thousand words, sometimes it’s good to have a few words to go with the picture.

The image above highlights the positive impact that fixing the issues we discussed above had on the client’s key performance indicators (KPIs) within the organic channel, specifically:

  • Sessions (visits to the site) were up by ~33% over the previous month (and up over 104% year-over-year)
    • This speaks to increased visibility within the organic channel and correlated well with increases in impressions and average ranking for various keyword buckets being tracked. 
    • Many increases in visits also came to URLs that would have been directly impacted by the provided recommendations.
  • Transactions from organic search channels were up ~30% month-over-month. This indicates the increase in traffic was likely driving more sales from the organic channel. I say “likely” because narrowing this change down to direct causation would be tricky (and frankly, unnecessary in this situation).
  • Along with increased transactions from organic visitors, we saw a ~31% jump in revenue month-over-month. 

It’s worth mentioning that for this study, I didn’t pull data reflecting the impact of these changes on assisted conversions from the organic channel. Essentially, the impact is even more pronounced/significant when that data is included. If you’re attempting to showcase the impact of your changes to a client, don’t forget to include attribution data when possible!

I’d also like to note that I highly suggest tying the impact of what you’ve done to business revenue or larger business goals. This isn’t always possible with search tactic implementation (sometimes you’re one or two steps removed from direct impact and must adjust more elements before changes can be seen), but it makes life much, much easier if you can :).

The Catch

If only life were so simple that you could implement these changes and see massive results every time. This used to be (more) the case in the good ol’ days of SEO, when engines were less sophisticated and brands were behind the times. But that’s all changed.

In order for technical changes to drive big impact, you need to have the other building blocks of a strong organic presence in place.

It’s important to consider things like the following:

  • Brand strength (take a look at how many people search for your branded terms vs. competitors using free search tools for a quick proxy)
  • Off-site signals such as links, mentions, and social discussions, etc.
  • On-page elements such as well-developed content that meets user needs and wants, well-targeted titles and meta descriptions to help with CTR and topic association, and valuable product information
  • User engagement signals such as CTR, bounce rates, conversions, mobile vs. desktop usage and experience

This is not an exhaustive list, and you can go very deep into all of these, but the point is that basic technical fixes can work if (and it’s a big if) you’ve got the goods to back them up.

No matter what your situation, ensuring a strong technical foundation for your website is crucial to maximizing its potential.

TL;DR

The fundamentals of technical SEO are powerful and can have a dramatic impact on your bottom line. Don’t ignore them and make sure you consult a search expert if you’re uncomfortable with getting things right on your own. This applies to SEOs who are less familiar with this aspect of the craft, too.
